Learning Bayesian Networks with Low Rank Conditional Probability Tables

Neural Information Processing Systems

In this paper, we provide a method to learn the directed structure of a Bayesian network using data. The data is accessed by making conditional probability queries to a black-box model. We introduce a notion of simplicity of representation of conditional probability tables for the nodes in the Bayesian network, which we call "low rankness". We connect this notion to the Fourier transform of real-valued set functions and propose a method which learns the exact directed structure of a low rank Bayesian network using very few queries. We formally prove that our method correctly recovers the true directed structure, runs in polynomial time and only needs polynomially many samples with respect to the number of nodes. We also provide further improvements in efficiency if we have access to some observational data.
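The abstract connects low rankness of a conditional probability table to the Fourier transform of real-valued set functions. The following is a minimal illustrative sketch of that connection, not the authors' algorithm: when a CPT entry depends on only a few of the candidate parent variables, its Boolean Fourier spectrum is sparse, supported only on subsets of those parents. The example CPT function here is hypothetical.

```python
import itertools
import numpy as np

def fourier_coefficients(f, n):
    """Boolean Fourier transform of f: {0,1}^n -> R.

    Returns one coefficient per subset S of {0, ..., n-1}:
    f_hat(S) = E_x[ f(x) * prod_{i in S} (-1)^{x_i} ].
    """
    coeffs = {}
    points = list(itertools.product([0, 1], repeat=n))
    for r in range(n + 1):
        for S in itertools.combinations(range(n), r):
            coeffs[S] = np.mean(
                [f(x) * (-1) ** sum(x[i] for i in S) for x in points]
            )
    return coeffs

# Hypothetical CPT entry: P(child = 1 | x) depends on only 2 of 4
# candidate parents, so its Fourier support lies in subsets of {0, 1}.
cpt = lambda x: 0.2 + 0.5 * x[0] * x[1]

coeffs = fourier_coefficients(cpt, 4)
support = [S for S, c in coeffs.items() if abs(c) > 1e-9]
print(len(coeffs), len(support))  # 16 subsets in total, only 4 carry weight
```

Such sparsity is what lets the spectrum, and hence the parent set, be estimated from far fewer queries than the 2^n entries of the full table would suggest.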


Reviews: Learning Bayesian Networks with Low Rank Conditional Probability Tables

Neural Information Processing Systems

The paper introduces a notion of low rank conditional probability tables (CPTs). Overall, the reviewers found the results innovative, interesting, and theoretically justified. Many of the reviewers' concerns were properly addressed in the rebuttal. Several reviewers stated that more empirical work would have greatly benefited the paper.


Reviews: Learning Bayesian Networks with Low Rank Conditional Probability Tables

Neural Information Processing Systems

This paper presents a method for structural learning of a BN given observational data. The work is mainly theoretical, and the proposal rests on some assumptions. Great effort is also put into presenting and theoretically developing the complexity of the algorithm. One of the key points in the proposed algorithm is the use of Fourier basis vectors (coefficients) and how they are applied in the compressed sensing step. I haven't checked thoroughly all the mathematical parts, which are the core of the paper.


Learning Bayesian Networks with Low Rank Conditional Probability Tables

Barik, Adarsh, Honorio, Jean

Neural Information Processing Systems

In this paper, we provide a method to learn the directed structure of a Bayesian network using data. The data is accessed by making conditional probability queries to a black-box model. We introduce a notion of simplicity of representation of conditional probability tables for the nodes in the Bayesian network, which we call "low rankness". We connect this notion to the Fourier transform of real-valued set functions and propose a method which learns the exact directed structure of a low rank Bayesian network using very few queries. We formally prove that our method correctly recovers the true directed structure, runs in polynomial time and only needs polynomially many samples with respect to the number of nodes.